General notes on epistemology

1. One of the biggest epistemological limitations is that neither we nor our
machines can internally represent the state of the world, or even of an
object immediately being contemplated.  Neither can we fully simulate
the effects of an action.  Therefore, we keep our eyes open, i.e.
our mental processes incorporate the effects of observation.
However, it seems possible that we can carry on internal mental
simulations that we can't verbalize.  Certainly people carry out
simulations that they don't verbalize.  The issue is whether these
processes, or rather the facts they represent, can be conveniently
expressed as logical sentences, or whether computers will require
quite different representations.  The key experiment would be to describe
a situation and ask people who can't see it whether a certain way
of achieving a goal would work.  Suppose that competent humans
independently arrive at the same correct answer.  Is it possible
that the reasoning required to reach this answer cannot be represented
by a reasonably small set of reasonable sentences in logic together
with a reasonably small number of deduction steps?
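
As a concrete illustration of what such a representation might look like,
here is a minimal sketch of the key experiment in the style of the
situation calculus, using the classic monkey-and-bananas scenario.  The
scenario, the fluent and action names, and the rendering in the Lean proof
assistant are assumptions made for illustration; they are not part of this
note.

    -- A minimal situation-calculus sketch; all names are illustrative.
    axiom Sit : Type                     -- situations
    axiom Action : Type                  -- actions
    axiom result : Action → Sit → Sit    -- the situation after an action

    axiom s0 : Sit                       -- the situation being described
    axiom pushBox : Action               -- push the box under the bananas
    axiom climbBox : Action              -- climb onto the box

    -- Fluents: propositions that hold in a situation.
    axiom boxUnderBananas : Sit → Prop
    axiom hasBananas : Sit → Prop

    -- Effect axioms: a "reasonably small set of reasonable sentences".
    axiom pushEffect : ∀ s : Sit, boxUnderBananas (result pushBox s)
    axiom climbEffect : ∀ s : Sit,
      boxUnderBananas s → hasBananas (result climbBox s)

    -- The proposed way of achieving the goal works, and checking it takes
    -- a reasonably small number of steps: two axiom applications.
    theorem planWorks : hasBananas (result climbBox (result pushBox s0)) :=
      climbEffect (result pushBox s0) (pushEffect s0)

Whether such a small axiomatization and short derivation exist for the
situations humans actually reason about, rather than for toy scenarios
like this one, is exactly the open question posed above.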